- A video shows a Tesla crashing into a child-sized mannequin while using Tesla Full Self-Driving.
- Tech CEO Dan O'Dowd has repeatedly said he wants to ban FSD.
- A Tesla spokesperson did not respond to a request for comment from Insider.
Tesla's Full Self-Driving software repeatedly crashed into a child-sized mannequin during an independent, non-regulatory test published this week by tech CEO and notable Tesla critic Dan O'Dowd, who funded the research.
O'Dowd, the CEO of Green Hills Software, launched a campaign last year to ban what he describes as unsafe software from safety systems. Since then, the organization, called The Dawn Project, has been actively testing Tesla's software.
The campaign is calling on Congress to shut the Tesla program down, and O'Dowd previously ran for Senate on a platform based solely on banning Tesla's FSD. The new video is one of several ads The Dawn Project has released criticizing the software.
On Tuesday, the organization released a video as a part of its latest ad campaign of a Tesla plowing through a mannequin while the company's Full Self-Driving (FSD) system was allegedly in use.
A Tesla spokesperson did not respond to a request for comment ahead of publication.
—Dan O'Dowd (@RealDanODowd) August 9, 2022
The 32-second video shows three instances of the Tesla driving through the mannequin at a test track in California.
A spokesperson for The Dawn Project told Insider the test was performed dozens of times over the last month, and that the vehicle never attempted to veer and never slowed below 25 miles per hour before hitting the mannequin. Only the three runs shown in the video were performed under affidavit.
Notably, the test was not conducted under the oversight of a US regulator but independently, meaning it was not subject to the same testing standards.
The National Highway Traffic Safety Administration (NHTSA) and the Insurance Institute for Highway Safety (IIHS), for example, evaluate vehicle safety in the US using at least five different types of tests. The NHTSA launched a program for testing automated-driving systems in 2020, but gauges the software via crash reports. Tesla has been able to avoid reporting data such as disengagements and accidents to the Department of Motor Vehicles because the system is classified as a level-two driver-assist system. Autonomous competitors such as Alphabet's Waymo are subject to different reporting standards because drivers are not required to monitor those vehicles.
In each test conducted by The Dawn Project, the Tesla would start at 40 miles per hour and drive for 100 yards within a designated lane before hitting the mannequin. The professional test driver was instructed to keep his hands off the wheel and only brake after the vehicle had made contact with the mannequin, according to The Dawn Project.
Tesla has told drivers that the system does not replace a licensed driver and instructs them to keep their hands on the wheel and be prepared to take over when the system is running.
After being placed in full self-drive mode, the Model 3 would "start to stagger as if lost and confused, slow down a little, and then speed back up as it hit and ran over the mannequins going over 25 miles per hour," the test driver, Art Haynie, said in a statement.
"The deeply disturbing results of our safety test of Tesla Full Self-Driving should be a rallying cry to action," O'Dowd said in a statement. "Elon Musk says Tesla's Full Self-Driving software is 'amazing.' It's not. It's a lethal threat to all Americans."
Tesla's FSD has sparked controversy in the past. Last month, the California Department of Motor Vehicles filed a complaint against the electric-car maker, alleging it has used "untrue or misleading" statements in advertising its driver-assistance programs.
Despite its name, FSD does not make a Tesla fully self-driving. It is an optional add-on that enables the car to automatically change lanes, enter and exit highways, recognize stop signs and traffic lights, and park. The software, still in beta testing, can be purchased for $12,000 up front or as a $199 monthly subscription, and it requires a licensed driver to monitor it at all times. It has more than 100,000 subscribers, whose real-world driving Tesla uses to test the software and let the system's AI learn from experienced drivers.